Search for: All records

Creators/Authors contains: "Schiavazzi, Daniele E"


  1. Free, publicly-accessible full text available November 1, 2026
  2. Estimation of cardiovascular model parameters from electronic health records (EHRs) poses a significant challenge, primarily due to lack of identifiability. Structural non-identifiability arises when a manifold in the space of parameters is mapped to a common output, while practical non-identifiability can result from limited data, model misspecification or noise corruption. To address the resulting ill-posed inverse problem, optimization-based or Bayesian inference approaches typically use regularization, thereby limiting the possibility of discovering multiple solutions. In this study, we use inVAErt networks, a neural-network-based, data-driven framework for enhanced digital twin analysis of stiff dynamical systems. We demonstrate the flexibility and effectiveness of inVAErt networks in the context of physiological inversion of a six-compartment lumped-parameter haemodynamic model, from synthetic data to real data with missing components. This article is part of the theme issue ‘Uncertainty quantification for healthcare and biological systems (Part 2)’.
    Free, publicly-accessible full text available April 2, 2026
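The entry above describes inVAErt networks only in words. The following is a minimal, hypothetical PyTorch sketch of the idea as summarized in the abstract: a forward emulator, a variational encoder whose latent variable w absorbs non-identifiable directions, and a decoder mapping an output y plus a sampled w back to candidate parameters, so that re-sampling w at a fixed measurement recovers multiple parameter sets. All class names, dimensions, layer sizes and loss terms are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an inVAErt-style network (PyTorch); all sizes and
# loss terms are illustrative assumptions, not the published architecture.
import torch
import torch.nn as nn

def mlp(d_in, d_out, width=64):
    return nn.Sequential(nn.Linear(d_in, width), nn.Tanh(),
                         nn.Linear(width, width), nn.Tanh(),
                         nn.Linear(width, d_out))

class InVAErtSketch(nn.Module):
    def __init__(self, dim_v, dim_y, dim_w):
        super().__init__()
        self.emulator = mlp(dim_v, dim_y)          # forward map v -> y
        self.encoder  = mlp(dim_v, 2 * dim_w)      # latent w for non-identifiable directions
        self.decoder  = mlp(dim_y + dim_w, dim_v)  # inverse map (y, w) -> v

    def forward(self, v):
        y_hat = self.emulator(v)
        mu, logvar = self.encoder(v).chunk(2, dim=-1)
        w = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)   # reparameterization
        v_hat = self.decoder(torch.cat([y_hat, w], dim=-1))
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return y_hat, v_hat, kl

# Training would penalize emulator error ||y_hat - y||, reconstruction error
# ||v_hat - v|| and the KL term; at inference, sampling w for one measured y
# yields a family of parameter sets mapped to (nearly) the same output.
```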
  3. When predicting physical phenomena through simulation, quantification of the total uncertainty due to multiple sources is as crucial as making sure the underlying numerical model is accurate. Possible sources include irreducible aleatoric uncertainty due to noise in the data, epistemic uncertainty induced by insufficient data or inadequate parameterization, and model-form uncertainty related to the use of misspecified model equations. In addition, recently proposed approaches provide flexible ways to combine information from data with full or partial satisfaction of equations that typically encode physical principles. Physics-based regularization interacts in non-trivial ways with aleatoric, epistemic and model-form uncertainty and their combination, and a better understanding of this interaction is needed to improve the predictive performance of physics-informed digital twins that operate under real conditions. To better understand this interaction, with a specific focus on biological and physiological models, this study investigates the decomposition of total uncertainty in the estimation of states and parameters of a differential system simulated with MC X-TFC, a new physics-informed approach for uncertainty quantification based on random projections and Monte Carlo sampling. After an introductory comparison between approaches for physics-informed estimation, MC X-TFC is applied to a six-compartment stiff ODE system, the CVSim-6 model, developed in the context of human physiology. The system is first analysed by progressively removing data while estimating an increasing number of parameters, and subsequently by investigating total uncertainty under model-form misspecification of nonlinear resistance in the pulmonary compartment. In particular, we focus on the interaction between the formulation of the discrepancy term and quantification of model-form uncertainty, and show how additional physics can help in the estimation process. The method demonstrates robustness and efficiency in estimating unknown states and parameters, even with limited, sparse and noisy data. It also offers great flexibility in integrating data with physics for improved estimation, even in cases of model misspecification. This article is part of the theme issue ‘Uncertainty quantification for healthcare and biological systems (Part 1)’.
    Free, publicly-accessible full text available March 13, 2026
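To make the two ingredients of MC X-TFC named above concrete (random projections and Monte Carlo sampling), here is a heavily simplified toy in NumPy: the state is represented with fixed random features whose linear coefficients come from a least-squares solve over physics-residual and data rows, and the data noise is resampled Monte Carlo style to propagate uncertainty into a parameter estimate. The scalar ODE du/dt = -θu, the grid search over θ and every weight are assumptions for illustration; the paper's constrained X-TFC expressions and the CVSim-6 system are substantially richer.

```python
# Toy sketch of the random-projection + Monte Carlo idea behind MC X-TFC.
# Problem and all settings are illustrative: estimate theta in du/dt = -theta*u
# from three noisy observations (true theta = 1.5, u(0) = 1).
import numpy as np

rng = np.random.default_rng(0)
w_rand, b_rand = rng.normal(size=40), rng.normal(size=40)   # frozen random projection
phi  = lambda t: np.tanh(np.outer(t, w_rand) + b_rand)      # random-feature basis
dphi = lambda t: (1 - phi(t)**2) * w_rand                   # its time derivative

t_col = np.linspace(0.0, 2.0, 50)                  # collocation points (physics rows)
t_obs = np.array([0.2, 0.9, 1.6])                  # sparse observation times
u_obs = np.exp(-1.5 * t_obs) + 0.02 * rng.normal(size=3)    # noisy data

def residual(theta, y, data_weight=10.0):
    # Least-squares solve for coefficients a in u(t) ~ phi(t) @ a, stacking
    # physics rows (du/dt + theta*u = 0) and weighted data rows.
    A = np.vstack([dphi(t_col) + theta * phi(t_col), data_weight * phi(t_obs)])
    r = np.concatenate([np.zeros(len(t_col)), data_weight * y])
    a = np.linalg.lstsq(A, r, rcond=None)[0]
    return np.sum((A @ a - r)**2)

thetas = np.linspace(0.5, 2.5, 81)
draws = []
for _ in range(200):                               # Monte Carlo over noise draws
    y = u_obs + 0.02 * rng.normal(size=3)
    draws.append(thetas[np.argmin([residual(th, y) for th in thetas])])
print(f"theta ~ {np.mean(draws):.2f} +/- {np.std(draws):.2f}")
```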
  4. Bayesian boundary condition (BC) calibration approaches from clinical measurements have successfully quantified inherent uncertainties in cardiovascular fluid dynamics simulations. However, estimating the posterior distribution for all BC parameters in three-dimensional (3D) simulations has been unattainable due to infeasible computational demand. We propose an efficient method to identify Windkessel parameter posteriors: we only evaluate the 3D model once for an initial choice of BCs and use the result to create a highly accurate zero-dimensional (0D) surrogate. We then perform Sequential Monte Carlo (SMC) using the optimized 0D model to derive the high-dimensional Windkessel BC posterior distribution. Optimizing 0D models to match 3D data a priori lowered their median approximation error by nearly one order of magnitude in 72 publicly available vascular models. The optimized 0D models generalized well to a wide range of BCs. Using SMC, we evaluated the high-dimensional Windkessel parameter posterior for different measured signal-to-noise ratios in a vascular model, which we validated against a 3D posterior. The minimal computational demand of our method using a single 3D simulation, combined with the open-source nature of all software and data used in this work, will increase access and efficiency of Bayesian Windkessel calibration in cardiovascular fluid dynamics simulations. This article is part of the theme issue ‘Uncertainty quantification for healthcare and biological systems (Part 1)’.
    Free, publicly-accessible full text available March 13, 2026
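The workflow above (a single 3D solve, an optimized 0D surrogate, then SMC) reduces to a generic tempered SMC loop once the surrogate is in hand. In this sketch, surrogate_0d, the measurement vector and all tuning constants are hypothetical placeholders for the tuned 0D model and the clinical targets; prior bounds and adaptive tempering are omitted for brevity.

```python
# Generic tempered-SMC sketch for Windkessel-style parameter calibration
# against a cheap surrogate. `surrogate_0d` and all numbers are placeholders.
import numpy as np

rng = np.random.default_rng(1)

def surrogate_0d(theta):
    # Stand-in for the tuned 0D model: maps Windkessel parameters to
    # predicted clinical targets (e.g. systolic/diastolic pressure).
    return theta @ np.array([[1.0, 0.3], [0.2, 1.1], [0.5, 0.4]])

y_meas, sigma = np.array([120.0, 80.0]), 2.0       # targets and noise SD

def loglike(theta):
    r = (surrogate_0d(theta) - y_meas) / sigma
    return -0.5 * np.sum(r**2, axis=-1)

n = 2000
theta = rng.uniform(20.0, 120.0, size=(n, 3))      # draws from a flat prior
logw = np.zeros(n)
betas = np.linspace(0.0, 1.0, 21)                  # fixed tempering schedule
for b0, b1 in zip(betas[:-1], betas[1:]):
    logw += (b1 - b0) * loglike(theta)             # incremental importance weights
    w = np.exp(logw - logw.max()); w /= w.sum()
    theta = theta[rng.choice(n, size=n, p=w)]      # multinomial resampling
    logw = np.zeros(n)
    prop = theta + rng.normal(scale=1.0, size=theta.shape)    # MH rejuvenation
    accept = np.log(rng.uniform(size=n)) < b1 * (loglike(prop) - loglike(theta))
    theta[accept] = prop[accept]

print("posterior mean:", theta.mean(axis=0))
```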
  5. Fast inference of numerical model parameters from data is an important prerequisite for generating predictive models in a wide range of applications. Sampling-based approaches such as Markov chain Monte Carlo may become intractable when each likelihood evaluation is computationally expensive. New approaches combining variational inference with normalizing flows are characterized by a computational cost that grows only linearly with the dimensionality of the latent variable space, and rely on gradient-based optimization instead of sampling, providing a more efficient approach for Bayesian inference about the model parameters. Moreover, the cost of frequently evaluating an expensive likelihood can be mitigated by replacing the true model with an offline-trained surrogate, such as a neural network. However, this approach might generate significant bias when the surrogate is insufficiently accurate around the posterior modes. To reduce the computational cost without sacrificing inferential accuracy, we propose Normalizing Flow with Adaptive Surrogate (NoFAS), an optimization strategy that alternately updates the normalizing flow parameters and the surrogate model parameters. We also propose an efficient sample weighting scheme for surrogate model training that preserves global accuracy while effectively capturing high posterior density regions. We demonstrate the inferential and computational superiority of NoFAS against various benchmarks, including cases where the underlying model lacks identifiability. The source code and numerical experiments used for this study are available at https://github.com/cedricwangyu/NoFAS.
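Entry 5 describes NoFAS in prose; below is a schematic PyTorch rendering of its alternating structure: gradient steps on a variational approximation against a surrogate likelihood, interleaved with surrogate refits on newly visited samples that require only occasional queries of the true model. To stay short, the "flow" here is a single affine map and the paper's sample weighting scheme is omitted; the linked repository contains the actual method.

```python
# Schematic of a NoFAS-style alternating loop; the affine "flow", toy model
# and all constants are illustrative stand-ins, not the published code.
import torch

true_model = lambda z: torch.sin(z) + 0.1 * z**2        # expensive model stand-in
y_obs, sigma = torch.tensor([0.7]), 0.05

surrogate = torch.nn.Sequential(torch.nn.Linear(1, 32), torch.nn.Tanh(),
                                torch.nn.Linear(32, 1))
mu = torch.zeros(1, requires_grad=True)                 # affine "flow": z = mu + exp(s)*eps
s  = torch.zeros(1, requires_grad=True)
opt_flow = torch.optim.Adam([mu, s], lr=1e-2)
opt_surr = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

bank_z, bank_y = [], []                                 # surrogate training bank
for it in range(2000):
    eps = torch.randn(64, 1)
    z = mu + torch.exp(s) * eps                         # reparameterized samples
    logq = -0.5 * eps.pow(2).sum(1) - s.sum()           # flow log-density (up to a constant)
    loglike = -0.5 * ((surrogate(z) - y_obs) / sigma).pow(2).sum(1)
    loss = (logq - loglike).mean()                      # negative ELBO under a flat prior
    opt_flow.zero_grad(); loss.backward(); opt_flow.step()

    if it % 200 == 0:                                   # alternate: refresh the surrogate
        z_new = z.detach()
        bank_z.append(z_new); bank_y.append(true_model(z_new))  # query the true model
        Z, Y = torch.cat(bank_z), torch.cat(bank_y)
        for _ in range(100):                            # (NoFAS adds a weighting scheme here)
            surr_loss = (surrogate(Z) - Y).pow(2).mean()
            opt_surr.zero_grad(); surr_loss.backward(); opt_surr.step()
```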
  6. Approximating probability distributions can be a challenging task, particularly when they are supported over regions of high geometrical complexity or exhibit multiple modes. Annealing can facilitate this task, and is often combined with constant, a priori selected increments in inverse temperature. However, constant increments limit computational efficiency, since they cannot adapt to situations where smooth changes in the annealed density could be handled equally well with larger increments. We introduce AdaAnn, an adaptive annealing scheduler that automatically adjusts the temperature increments based on the expected change in the Kullback-Leibler divergence between two distributions with sufficiently close annealing temperatures. AdaAnn is easy to implement and can be integrated into existing sampling approaches such as normalizing flows for variational inference and Markov chain Monte Carlo. We demonstrate the computational efficiency of the AdaAnn scheduler for variational inference with normalizing flows on a number of examples, including posterior estimation of parameters for dynamical systems and probability density approximation in multimodal and high-dimensional settings.
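The AdaAnn rule in entry 6 can be made concrete with a second-order expansion: KL(p_t || p_{t+dt}) ≈ (dt²/2)·Var_{p_t}[log L(x)], so a fixed KL budget tau gives dt = sqrt(2·tau)/std(log L), estimated from samples of the current tempered density. The Gaussian tempering toy below, including every constant, is an assumption used only to exercise the scheduler; it is not the paper's experiment.

```python
# Sketch of an AdaAnn-style adaptive increment: step sizes chosen so that
# consecutive tempered targets stay a fixed KL distance apart. The Gaussian
# toy problem and all constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(2)

def adaann_step(loglike_samples, tau=0.01, dt_min=1e-4, dt_max=0.1):
    # KL(p_t || p_{t+dt}) ~= dt^2/2 * Var[log L]  =>  dt = sqrt(2*tau)/std(log L)
    dt = np.sqrt(2.0 * tau) / (np.std(loglike_samples) + 1e-12)
    return float(np.clip(dt, dt_min, dt_max))

# Toy check: temper a N(3, 0.2^2) likelihood against a N(0, 2^2) prior.
# The tempered density p_t is Gaussian in closed form, so it can be sampled
# exactly at each temperature.
def tempered_moments(t):
    prec = 1.0 / 2.0**2 + t / 0.2**2
    return (t * 3.0 / 0.2**2) / prec, np.sqrt(1.0 / prec)

loglike = lambda x: -0.5 * ((x - 3.0) / 0.2)**2

t, schedule = 0.01, [0.01]
while t < 1.0:
    m, sd = tempered_moments(t)
    x = rng.normal(m, sd, size=2000)         # samples from the current p_t
    t = min(1.0, t + adaann_step(loglike(x)))
    schedule.append(t)
print(f"{len(schedule)} temperatures; early steps are small, later steps large")
```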